United States Army (noun) — (military) the army of the United States of America; the agency that organizes and trains soldiers for land warfare.